MuleSoft Accelerator for Consumer Goods
Use case 3 - Trade promotion effectiveness
Visualize your promotions, promotion tactics and account plans in one integrated platform with pre-built assets that integrate Business Intelligence Tools (BI Tools), such as CRM Analytics, into a TPM platform like Salesforce Consumer Goods Cloud.
Overview
- Description
- Solution definition
- Assumptions and constraints
- High-level architecture
- Activity Diagram
- Processing logic
- Successful outcome
See also
Description
To achieve successful trade promotion effectiveness, it is necessary to have a scalable integration system that can serve as a centralized platform for managing data, facilitating collaboration across different departments, automating manual processes, and offering visibility into the performance of trade promotions. With MuleSoft Accelerator for Consumer Goods, companies can effectively integrate systems to measure the impact of their promotions, make informed decisions on how to optimize them, and respond swiftly to changes in market conditions.
This use case unlocks critical data for analysis and allows key account managers and finance teams to run more efficient and data-driven trade promotions. This solution enables organizations to unify reporting and build visualizations to support their analysis of promotions, promotion tactics and account plans. With access to this information, account teams are better equipped to optimize trade spend ROI and drive sustainable growth.
Ultimately, consumer goods companies and their stakeholders will be able to have a holistic view of their trade spend.
Glossary
| Term | Definition |
| --- | --- |
| CG | Consumer Goods |
| CRMA | CRM Analytics |
| OS | Object Store |
| RTR | Real Time Reporting |
| Hyperforce | Salesforce's public cloud architecture |
Goals
- Support the extraction, transformation and loading of KPIs for Promotions, Promotion Tactics and Account Plans from Hyperforce into Salesforce CRM Analytics.
- Orchestrate the Hyperforce APIs to trigger the export job that extracts the CSV files for Promotions, Promotion Tactics and Account Plans data, and export them to Salesforce CRM Analytics.
- Transform the .csv files generated by Hyperforce with dynamic KPI fields and load the full and delta data into Salesforce CRM Analytics.
- Upload the Promotions, Promotion Tactics and Account Plans and corresponding Metric KPI as datasets to Salesforce CRM Analytics.
- Execute required Recipe API endpoints on Salesforce CRM Analytics to load accurate reports and dashboards.
- Ensure the export job in Hyperforce is committed after each successful process.
Use case considerations
- Hyperforce System APIs work on top of the [Export KPI feature of TPM](https://developer.salesforce.com/docs/atlas.en-us.cgcloud_rtr_dev_guide.meta/cgcloud_rtr_dev_guide/rtr_integration_api_introduction.htm) and orchestrate calls to different endpoints to export data from Hyperforce and integrate it with MuleSoft.
- The Hyperforce System API triggers the export of full and delta load files based on the incoming request.
The Hyperforce System API supports the following:
- Schedule an export job for a given metaData, businessYear and salesOrg.
- Check the status of the job.
- Get files related to the job.
- Commit the job.
- A scheduler on the Data Normalization Process API schedules the export jobs in Hyperforce via the Export KPI feature APIs.
- The Data Normalization Process API checks the status of the job that it scheduled.
- Upon the completion of the job in Hyperforce with the status as Ready, the Data Normalization Process API extracts the file from Hyperforce via the Hyperforce System API.
- The Data Normalization Process API publishes the files generated for Promotions, Promotion Tactics and Account Plans to an S3 bucket.
- AWS Lambda is used to extract and split files per a configured record count. The split size is set up as an environment variable in AWS Lambda.
- The Data Normalization Process API
- Reads each file
- For the first file per every metaData/object, creates a KPI Metric file with the KPI columns in S3.
- Dynamically generates the metadata.json file
- Maps the file to CRM Analytics format
- Sends a request to the [InsightsExternalData API](/sobjects/InsightsExternalData) to create the dataset with the dataSetName (configured) and the metadata.json (base64 encoded format)
- If the file belongs to a full load job and is the first file, the operation OVERWRITE is executed on the InsightsExternalData API.
- During the processing of the first file for every object, KPI Metric information (column names of the file) is appended to the KPI Metric file in S3.
- If the file belongs to a delta load job:
  - the operation INSERT is executed on the InsightsExternalData API for the records with Action i
  - the operation UPDATE is executed on the InsightsExternalData API for the records with Action u
  - the operation DELETE is executed on the InsightsExternalData API for the records with Action d
- The mapped file is then compressed with gzip, converted to base64 encoded format, and uploaded as a part (with a PartNumber) via the InsightsExternalDataPart API - {{_endpoint}}/services/data/v{{version}}/sobjects/InsightsExternalDataPart
- After the ingestion of all the files for the three objects (Promotions, PromotionTactic, AccountPlan), a scheduler (schedulers_trigger-batch-job) in the Data Normalization Process API picks up the KPI Metric file from S3 and uploads it to CRM Analytics.
- Once all the split files related to all the three businessObjects or metadata are completely ingested, the External Data Recipe API is triggered on CRM Analytics.
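The unzip-and-split step described above can be sketched in Python. This is a minimal illustration assuming a split by record count; the function names are hypothetical and do not reflect the accelerator's actual Lambda source, where the split size comes from an environment variable.

```python
import csv
import gzip
import io


def split_csv(gz_bytes: bytes, max_records: int) -> list[str]:
    """Extract a gzipped CSV and split it into chunks of at most
    max_records data rows, repeating the header row in every chunk.
    Hypothetical sketch; max_records stands in for the Lambda's
    configured record count."""
    text = gzip.decompress(gz_bytes).decode("utf-8")
    reader = csv.reader(io.StringIO(text))
    header = next(reader)

    chunks, current = [], []
    for row in reader:
        current.append(row)
        if len(current) == max_records:
            chunks.append(_to_csv(header, current))
            current = []
    if current:  # flush the final, possibly smaller, chunk
        chunks.append(_to_csv(header, current))
    return chunks


def _to_csv(header: list[str], rows: list[list[str]]) -> str:
    """Serialize header plus rows back to CSV text."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()
```

Each chunk keeps the header row, so downstream mapping can treat every split file like the original export.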
Technical considerations
- RTR Apex classes should be set up and deployed on Salesforce CG Cloud.
- The AWS Lambda function, along with the S3 buckets and folders, should be set up per the setup guide.
- Export configurations should be set up on Salesforce CG Cloud.
- An Integration User with the necessary permissions should be created for MuleSoft integration with the External Data REST APIs:
- CRM Analytics Plus
- CRM Analytics Plus Admin
- CRM Analytics Platform Admin
- CRM Analytics license should be enabled.
- Consumer Goods Cloud Intelligence license should be enabled on Salesforce CG Cloud.
End-to-end scenarios
- Promotion Measures KPIs created or updated in Hyperforce are reflected in Salesforce CRM Analytics.
- Promotion Tactics KPIs created or updated in Hyperforce are reflected in Salesforce CRM Analytics.
- Account Plans KPIs created or updated in Hyperforce are reflected in Salesforce CRM Analytics.
- KPI metric dataset including KPIs from Promotion, Promotion Tactic and Account Plan files are reflected in Salesforce CRM Analytics.
Solution definition
The primary set of use cases covered in the current solution involves the synchronization of KPIs of Promotion Measures, Promotion Tactics and Account Plans data between Hyperforce and Salesforce CRM Analytics.
Goal
Integrate KPIs of Promotion Measures, Promotion Tactics and Account Plans data from Hyperforce to Salesforce CRM Analytics.
Main success scenario
- A scheduler is set up for each businessObject or metaData in the Data Normalization Process API.
- The Data Normalization Process API checks the Object Store to determine if any previous job with the same configuration of metaName, businessYear and salesOrg is still in InProgress status.
- If there is an entry, it does not schedule a new job. Otherwise, it proceeds with the following steps.
- The Data Normalization Process API invokes the Hyperforce System APIs to initiate the export jobs in Hyperforce to retrieve Promotions, Promotion Tactics and Account Plans based on a particular business year and sales org.
- Hyperforce System API will invoke the corresponding backend RTR Gateway APIs to trigger the export job.
- The Data Normalization Process API checks the status of the job for Ready status in the Hyperforce System API at a configured frequency.
- Once the status of the job is Ready, the Data Normalization Process API invokes the Hyperforce System API to get the file in .csv.gz format.
- The Data Normalization Process API writes the file to S3 in the inbox folder.
- A Lambda function set up in AWS reads the file from the inbox folder, extracts the .csv, splits it per the configured size count and writes it back to the inbox folder. For example, a 2 GB file from Hyperforce will be split into 200 files of 10 MB each.
- The Data Normalization Process API:
- Processes each file, extracts the column names for generating the KPI metric from the first file.
- Creates a KPI Metrics file in S3 for the first file.
- Generates the metadata file from the first file.
- Transforms data from Hyperforce format to CRM Analytics format for each file as per metadata configuration.
- Creates a dataset with the given name and metadata.json via the InsightsExternalData API ({{_endpoint}}/services/data/v{{version}}/sobjects/InsightsExternalData)
- Compresses the mapped file to gZip format and converts it to base64
- Adds the data to the dataSet via the InsightsExternalDataPart API ({{_endpoint}}/services/data/v{{version}}/sobjects/InsightsExternalDataPart)
- The data is uploaded as parts, with PartNumber corresponding to the sequence of files. Each part has an InsightsExternalDataId set to the header ID of the InsightsExternalData record created earlier. Part numbers start at 1 and are contiguous.
- For files related to the delta load, the records with Action i, u, d will be uploaded separately, with operations corresponding to upsert and delete, via the InsightsExternalData API ({{_endpoint}}/services/data/v{{version}}/sobjects/InsightsExternalData)
- Executes PATCH operation on ({{_endpoint}}/services/data/v{{version}}/sobjects/InsightsExternalData) with Action: Process to process the file.
After all the split files related to all three businessObjects or metaData are processed, the Data Normalization Process API invokes the trigger recipe on CRM Analytics.
- Invokes the CRM Analytics REST API (/services/data/v56.0/wave/folders) to get the list of all apps present in CRM Analytics.
- Checks the “templateSourceId” parameter for a value of “sfdc_internal__CG_TPE_Analytics”.
- If it exists, checks the “applicationStatus” parameter for a “completedstatus” value. If it does not exist, the Data Normalization Process continues with Step 12.
- Extracts the “name” of the apps and stores these relevant names.
- Invokes the recipes endpoint (/services/data/v57.0/wave/recipes) and searches for recipes that start with _CG_TPE_Workflow.
- Validates the response to check that the name matches the dataSetName configured in the property file:
  - For Promotion dataset: $.recipeDefinition.nodes.LOAD_DATASET0.parameters.dataset.name
  - For Promotion Tactic dataset: $.recipeDefinition.nodes.LOAD_DATASET1.parameters.dataset.name
  - For Account Plan dataset: $.recipeDefinition.nodes.LOAD_DATASET17.parameters.dataset.name
  - For KPI List dataset: $.recipeDefinition.nodes.LOAD_DATASET14.parameters.dataset.name
- If all the four datasetNames match, extract the recipeId and targetdataflowId.
- Triggers the recipes based on recipeId and targetdataflowId.
- The Data Normalization Process API invokes the commit job on the Hyperforce System API.
- The Hyperforce System API sends a request to the backend RTR Gateway API to commit the job in Hyperforce.
- Until a job is committed in Hyperforce, the files related to that job are considered full load. Once a job is committed in Hyperforce, the files related to the job are considered delta.
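The upload sequence above (create the InsightsExternalData header with the base64-encoded metadata.json, upload gzipped base64 parts, then PATCH with Action: Process) can be sketched as plain payload builders. The sObject field names (EdgemartAlias, MetadataJson, PartNumber, DataFile) follow the Salesforce InsightsExternalData API; the helper functions themselves are hypothetical, not the accelerator's actual code.

```python
import base64
import gzip
import json


def build_header(dataset_name: str, metadata_json: dict, operation: str) -> dict:
    """Payload for POST .../sobjects/InsightsExternalData.
    operation is e.g. 'Overwrite' for a full load."""
    return {
        "EdgemartAlias": dataset_name,  # no spaces allowed in the dataset name
        "Format": "Csv",
        "Operation": operation,
        "Action": "None",  # parts are uploaded before processing starts
        "MetadataJson": base64.b64encode(
            json.dumps(metadata_json).encode()
        ).decode(),
    }


def build_part(header_id: str, csv_text: str, part_number: int) -> dict:
    """Payload for POST .../sobjects/InsightsExternalDataPart.
    Part numbers start at 1 and must be contiguous."""
    return {
        "InsightsExternalDataId": header_id,
        "PartNumber": part_number,
        "DataFile": base64.b64encode(gzip.compress(csv_text.encode())).decode(),
    }


def build_process_action() -> dict:
    """Payload for PATCH .../sobjects/InsightsExternalData/{id} to
    start processing all uploaded parts."""
    return {"Action": "Process"}
```

Separating payload construction from HTTP delivery keeps the mapping testable; in the actual flow these payloads would be sent with an authenticated Salesforce REST client.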
Assumptions and constraints
The following is used to guide or constrain the solution design at a high level:
- CRM Analytics limits:
- Maximum file size per external data uploads: 40 GB
- Maximum file size for all external data uploads in a rolling 24-hour period: 50 GB
- Maximum number of external data jobs per dataset that can be run in a rolling 24-hour period: 50
- Maximum number of characters in a field: 32,000
- Maximum number of fields in a record: 5,000 (including up to 1,000 date fields)
- Maximum number of characters for all fields in a record: 400,000
- The solution currently supports one salesOrg per businessObject.
- No spaces are allowed in the dataSetName in Salesforce CRM Analytics.
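A minimal pre-flight check against the limits above might look like the following sketch. The limit values come from the constraints listed here; the function and its error messages are illustrative, not part of the accelerator.

```python
# Per-record CRM Analytics limits from the constraints above.
MAX_FIELD_CHARS = 32_000
MAX_FIELDS_PER_RECORD = 5_000
MAX_RECORD_CHARS = 400_000


def validate_record(dataset_name: str, record: dict) -> list[str]:
    """Return a list of constraint violations for one record
    (empty list means the record passes). Hypothetical helper."""
    errors = []
    if " " in dataset_name:
        errors.append("dataSetName must not contain spaces")
    if len(record) > MAX_FIELDS_PER_RECORD:
        errors.append("too many fields in record")
    total = 0
    for field, value in record.items():
        if len(str(value)) > MAX_FIELD_CHARS:
            errors.append(f"field {field} exceeds 32,000 characters")
        total += len(str(value))
    if total > MAX_RECORD_CHARS:
        errors.append("record exceeds 400,000 characters across all fields")
    return errors
```

Running checks like these before upload avoids burning one of the 50 external data jobs allowed per dataset in a rolling 24-hour period.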
High-level architecture
Activity diagram
The diagram below illustrates the sequence for processing the Promotion Measures and Promotion Tactics data from Hyperforce to Salesforce CRM Analytics.
Processing logic
The primary handling and orchestration of Promotion Tactics and Promotion Measures will be implemented in the Data Normalization Process API. This process can be described as follows:
- A scheduler is set up for each businessObject or metaData in the Data Normalization Process API.
- The Data Normalization Process API checks the Object Store to determine if any previous job with the same configuration of metaName, businessYear and salesOrg is still in InProgress status.
- If there is an entry, it does not schedule a new job. Otherwise, it proceeds with the following steps.
- The Data Normalization Process API invokes the Hyperforce System APIs to initiate the export jobs in Hyperforce to retrieve Promotions, Promotion Tactics and Account Plans based on a particular business year and sales org.
- Hyperforce System API will invoke the corresponding backend RTR Gateway APIs to trigger the export job.
- The Data Normalization Process API checks the status of the job for Ready status in the Hyperforce System API at a configured frequency.
- Once the status of the job is Ready, the Data Normalization Process API invokes the Hyperforce System API to get the file in .csv.gz format.
- The Data Normalization Process API writes the file to S3 in the inbox folder.
- A Lambda function set up in AWS reads the file from the inbox folder, extracts the .csv, splits it per the configured size count and writes it back to the inbox folder. For example, a 2 GB file from Hyperforce will be split into 200 files of 10 MB each.
- The Data Normalization Process API:
- Processes each file, extracts the column names for generating the KPI metric from the first file.
- Creates a KPI Metrics file in S3 for the first file.
- Generates the metadata file from the first file.
- Transforms data from Hyperforce format to CRM Analytics format for each file as per metadata configuration.
- Creates a dataset with the given name and metadata.json via the InsightsExternalData API ({{_endpoint}}/services/data/v{{version}}/sobjects/InsightsExternalData)
- Compresses the mapped file to gZip format and converts it to base64
- Adds the data to the dataSet via the InsightsExternalDataPart API ({{_endpoint}}/services/data/v{{version}}/sobjects/InsightsExternalDataPart)
- The data is uploaded as parts, with PartNumber corresponding to the sequence of files. Each part has an InsightsExternalDataId set to the header ID of the InsightsExternalData record created earlier. Part numbers start at 1 and are contiguous.
- For files related to the delta load, the records with Action i, u, d will be uploaded separately, with operations corresponding to upsert and delete, via the InsightsExternalData API ({{_endpoint}}/services/data/v{{version}}/sobjects/InsightsExternalData)
- Executes PATCH operation on ({{_endpoint}}/services/data/v{{version}}/sobjects/InsightsExternalData) with Action: Process to process the file.
After all the split files related to all three businessObjects or metaData are processed, the Data Normalization Process API invokes the trigger recipe on CRM Analytics.
- Invokes the CRM Analytics REST API (/services/data/v56.0/wave/folders) to get the list of all apps present in CRM Analytics.
- Checks the “templateSourceId” parameter for a value of “sfdc_internal__CG_TPE_Analytics”.
- If it exists, checks the “applicationStatus” parameter for a “completedstatus” value. If it does not exist, the Data Normalization Process continues with Step 12.
- Extracts the “name” of the apps and stores these relevant names.
- Invokes the recipes endpoint (/services/data/v57.0/wave/recipes) and searches for recipes that start with _CG_TPE_Workflow.
- Validates the response to check that the name matches the dataSetName configured in the property file:
  - For Promotion dataset: $.recipeDefinition.nodes.LOAD_DATASET0.parameters.dataset.name
  - For Promotion Tactic dataset: $.recipeDefinition.nodes.LOAD_DATASET1.parameters.dataset.name
  - For Account Plan dataset: $.recipeDefinition.nodes.LOAD_DATASET17.parameters.dataset.name
  - For KPI List dataset: $.recipeDefinition.nodes.LOAD_DATASET14.parameters.dataset.name
- If all the four datasetNames match, extract the recipeId and targetdataflowId.
- Triggers the recipes based on recipeId and targetdataflowId
- The Data Normalization Process API invokes the commit job on the Hyperforce System API
- The Hyperforce System API sends a request to the backend RTR Gateway API to commit the job in Hyperforce.
- Until a job is committed in Hyperforce, the files related to that job are considered full load. Once a job is committed in Hyperforce, the files related to the job are considered delta.
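The dataset-name validation step above can be sketched as a plain dictionary walk of the recipe definition. The node names (LOAD_DATASET0, LOAD_DATASET1, LOAD_DATASET17, LOAD_DATASET14) come from the documented JSONPath expressions; the expected dataset names shown here are placeholder configuration values.

```python
# Placeholder dataset names standing in for the configured dataSetName values.
EXPECTED_NODES = {
    "LOAD_DATASET0": "promotion",         # Promotion dataset
    "LOAD_DATASET1": "promotion_tactic",  # Promotion Tactic dataset
    "LOAD_DATASET17": "account_plan",     # Account Plan dataset
    "LOAD_DATASET14": "kpi_list",         # KPI List dataset
}


def recipe_matches(recipe: dict, expected: dict) -> bool:
    """Check $.recipeDefinition.nodes.<NODE>.parameters.dataset.name
    for every expected node; return True only if all four match."""
    nodes = recipe.get("recipeDefinition", {}).get("nodes", {})
    for node, dataset_name in expected.items():
        actual = (
            nodes.get(node, {})
            .get("parameters", {})
            .get("dataset", {})
            .get("name")
        )
        if actual != dataset_name:
            return False
    return True
```

Only when all four dataset names match would the process go on to extract the recipeId and targetdataflowId and trigger the recipe.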
Successful outcome
After completing the update processing for all target systems successfully, the following conditions will be met:
- Promotion Measures KPIs from a particular year and salesOrg in the Salesforce Consumer Goods Cloud are synchronized with Salesforce CRM Analytics.
- Promotion Tactics KPIs from a particular year and salesOrg in the Salesforce Consumer Goods Cloud are synchronized with Salesforce CRM Analytics.
- Account Plans KPIs from a particular year and salesOrg in the Salesforce Consumer Goods Cloud are synchronized with Salesforce CRM Analytics.
Systems involved
- Hyperforce
- CRM Analytics
Setup instructions
Before you begin
The Getting Started with MuleSoft Accelerators guide provides general information on getting started with the accelerator components. This includes instructions on setting up your local workstation for configuring and deploying the applications.
Downloadable assets
Accelerator System APIs
- RCG Hyperforce System API | API Specification | Implementation Template
- RCG CRM Analytics System API | API Specification | Implementation Template
Accelerator Process APIs
- RCG Data Normalization Process API | API Specification | Implementation Template
Custom Components
- RCG AWS Lambda Unzip and Split Files - Source